
1. Identity statement
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Holder Code: ibi 8JMKD3MGPEW34M/46T9EHH
Identifier: 8JMKD3MGPEW34M/45CUTES
Repository: sid.inpe.br/sibgrapi/2021/09.06.22.34
Last Update: 2021:09.06.22.53.01 (UTC) administrator
Metadata Repository: sid.inpe.br/sibgrapi/2021/09.06.22.34.57
Metadata Last Update: 2022:09.10.00.16.17 (UTC) administrator
DOI: 10.1109/SIBGRAPI54419.2021.00011
Citation Key: PontiSantRibeCava:2021:AvPiGo
Title: Training Deep Networks from Zero to Hero: avoiding pitfalls and going beyond
Format: On-line
Year: 2021
Access Date: 2024, May 04
Number of Files: 1
Size: 1275 KiB
2. Context
Author: 1. Ponti, Moacir Antonelli
2. Santos, Fernando Pereira dos
3. Ribeiro, Leo Sampaio Ferraz
4. Cavallari, Gabriel Biscaro
Affiliation: 1. Universidade de São Paulo
2. Universidade de São Paulo
3. Universidade de São Paulo
4. Universidade de São Paulo
Editor: Paiva, Afonso
Menotti, David
Baranoski, Gladimir V. G.
Proença, Hugo Pedro
Junior, Antonio Lopes Apolinario
Papa, João Paulo
Pagliosa, Paulo
dos Santos, Thiago Oliveira
e Sá, Asla Medeiros
da Silveira, Thiago Lopes Trugillo
Brazil, Emilio Vital
Ponti, Moacir A.
Fernandes, Leandro A. F.
Avila, Sandra
e-Mail Address: moacirponti@gmail.com
Conference Name: Conference on Graphics, Patterns and Images, 34 (SIBGRAPI)
Conference Location: Gramado, RS, Brazil (virtual)
Date: 18-22 Oct. 2021
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Book Title: Proceedings
Tertiary Type: Tutorial
History (UTC): 2021-09-06 22:53:01 :: moacirponti@gmail.com -> administrator :: 2021
2022-03-03 04:41:59 :: administrator -> menottid@gmail.com :: 2021
2022-03-03 12:30:25 :: menottid@gmail.com -> administrator :: 2021
2022-09-10 00:16:17 :: administrator -> :: 2021
3. Content and structure
Is the master or a copy?: is the master
Content Stage: completed
Transferable: 1
Version Type: finaldraft
Keywords: Deep Learning
Convolutional Networks
Survey
Training
Abstract: Training deep neural networks can be challenging with real-world data. Using models as black boxes, even with transfer learning, can yield poor generalization or inconclusive results on small datasets or in specific applications. This tutorial covers the basic steps as well as more recent options for improving models, focused on, but not restricted to, supervised learning. It is particularly useful for datasets that are not as well prepared as those in challenges, and for settings with scarce annotation and/or small data. We describe basic procedures such as data preparation, optimization, and transfer learning, as well as recent architectural choices, including transformer modules, alternative convolutional layers, activation functions, and width/depth, and training procedures including curriculum, contrastive, and self-supervised learning.
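The abstract mentions transfer learning under scarce annotation. As a minimal, illustrative sketch (not code from the tutorial; all names and data here are invented), the idea of keeping a pretrained feature extractor frozen and training only a small classifier head can be shown with NumPy, using a fixed random projection to stand in for the frozen backbone:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "small dataset": two Gaussian blobs in 20 dimensions.
X = np.vstack([rng.normal(-1, 1, (100, 20)), rng.normal(1, 1, (100, 20))])
y = np.array([0] * 100 + [1] * 100)

# Data preparation: standardize features to zero mean, unit variance.
X = (X - X.mean(axis=0)) / X.std(axis=0)

# "Frozen" feature extractor: a fixed projection (stand-in for a
# pretrained backbone whose weights are not updated).
W_frozen = rng.normal(size=(20, 8))
feats = np.tanh(X @ W_frozen)

# Train only the head: logistic regression by gradient descent.
w = np.zeros(8)
b = 0.0
lr = 0.5
for _ in range(200):
    p = 1 / (1 + np.exp(-(feats @ w + b)))  # sigmoid probabilities
    w -= lr * feats.T @ (p - y) / len(y)    # gradient w.r.t. head weights
    b -= lr * (p - y).mean()                # gradient w.r.t. head bias

acc = ((feats @ w + b > 0) == (y == 1)).mean()
print(f"head-only accuracy: {acc:.2f}")
```

Only the head parameters `w` and `b` are updated, which is why this strategy needs far fewer labeled examples than training the full network from scratch.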
Arrangement: urlib.net > SDLA > Fonds > SIBGRAPI 2021 > Training Deep Networks...
doc Directory Content: access
source Directory Content: there are no files
agreement Directory Content:
agreement.html 06/09/2021 19:34 1.3 KiB 
4. Conditions of access and use
data URL: http://urlib.net/ibi/8JMKD3MGPEW34M/45CUTES
zipped data URL: http://urlib.net/zip/8JMKD3MGPEW34M/45CUTES
Language: en
Target File: 2021_sibgrapi__tutorial_CR.pdf
User Group: moacirponti@gmail.com
Visibility: shown
Update Permission: not transferred
5. Allied materials
Mirror Repository: sid.inpe.br/banon/2001/03.30.15.38.24
Next Higher Units: 8JMKD3MGPEW34M/45PQ3RS
Citing Item List: sid.inpe.br/sibgrapi/2021/11.12.11.46 3
sid.inpe.br/banon/2001/03.30.15.38.24 2
Host Collection: sid.inpe.br/banon/2001/03.30.15.38
6. Notes
Empty Fields: archivingpolicy archivist area callnumber contenttype copyholder copyright creatorhistory descriptionlevel dissemination edition electronicmailaddress group isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder schedulinginformation secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url volume

